Determining the Dimension in Sliced Inverse Regression and Related Methods

Author

  • Louis FERRE

Abstract

Sliced inverse regression and principal Hessian directions (Li, 1991, 1992) aim to reduce the dimensionality of regression problems. An important step in these methods is the determination of a suitable dimension. While statistical tests based on the nullity of eigenvalues are usually suggested, we here focus on the quality of the estimation of the effective dimension reduction (edr) spaces. Essentially, our goal is to retain only sufficiently stable subspaces. The goodness of the estimation is measured by the squared trace correlation between the subspaces of the edr space and their estimates. Asymptotic expansions are derived and estimates deduced. Simulations give insight into the behaviour of the criterion and indicate how it can be used in practice.
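The two ingredients of the abstract can be sketched in a few lines of NumPy: a basic sliced inverse regression estimator of the edr directions, and the squared trace correlation used as the closeness criterion between a subspace and its estimate. This is a minimal illustration, not the paper's procedure; the function names and the simple order-statistics slicing scheme are assumptions for the sketch.

```python
import numpy as np

def sir_directions(X, y, n_slices=10):
    """Basic sliced inverse regression: eigen-decompose Cov(E[Z | y])
    for standardized predictors Z, then map the directions back to X."""
    n, p = X.shape
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    L = np.linalg.cholesky(Sigma)
    # Standardize: Z has identity covariance (up to sampling error)
    Z = np.linalg.solve(L, (X - mu).T).T
    # Slice on the order statistics of y; average Z within each slice
    slices = np.array_split(np.argsort(y), n_slices)
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors span the estimated edr space (in Z scale)
    vals, vecs = np.linalg.eigh(M)
    desc = np.argsort(vals)[::-1]
    # Back-transform directions to the original X scale
    B = np.linalg.solve(L.T, vecs[:, desc])
    return vals[desc], B

def squared_trace_correlation(B1, B2):
    """Squared trace correlation between the column spaces of B1 and B2:
    trace(P1 P2) / k with P_i the orthogonal projector onto span(B_i)."""
    Q1, _ = np.linalg.qr(B1)
    Q2, _ = np.linalg.qr(B2)
    return np.trace(Q1.T @ Q2 @ Q2.T @ Q1) / B1.shape[1]
```

The criterion equals 1 when the two k-dimensional subspaces coincide and 0 when they are orthogonal, which is what makes it a natural measure of how stable an estimated edr subspace is as the candidate dimension grows.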


Related articles

The dr package

The dr package for R for dimension reduction regression was first documented in Weisberg (2002). This is a revision of that article, to correspond to version 3.0.0 of dr for R added to CRAN (cran.r-project.org) in Fall 2007. Regression is the study of the dependence of a response variable y on a collection of p predictors collected in x. In dimension reduction regression, we seek to find a few ...


Likelihood-based Sufficient Dimension Reduction

We obtain the maximum likelihood estimator of the central subspace under conditional normality of the predictors given the response. Analytically and in simulations we found that our new estimator can perform much better than sliced inverse regression, sliced average variance estimation and directional regression, and that it seems quite robust to deviations from normality.


Sliced Coordinate Analysis for Effective Dimension Reduction and Nonlinear Extensions

Sliced inverse regression (SIR) is an important method for reducing the dimensionality of input variables. Its goal is to estimate the effective dimension reduction directions. In classification settings, SIR is closely related to Fisher discriminant analysis. Motivated by reproducing kernel theory, we propose a notion of nonlinear effective dimension reduction and develop a nonlinear extension...


Sufficient dimension reduction in regressions across heterogeneous subpopulations

Sliced inverse regression is one of the most widely used dimension reduction methods. Chiaromonte and co-workers extended this method to regressions with qualitative predictors and developed a method, partial sliced inverse regression, under the assumption that the covariance matrices of the continuous predictors are constant across the levels of the qualitative predictor. We extend partial sliced i...


Dimension Reduction Based on Canonical Correlation

Dimension reduction is helpful and often necessary in exploring nonlinear or nonparametric regression structures with a large number of predictors. We consider using the canonical variables from the design space whose correlations with a spline basis in the response space are significant. The method can be viewed as a variant of sliced inverse regression (SIR) with simple slicing replaced by Bs...



Publication date: 1997